An Extension of Multi-Layer Perceptron Based on Layer-Topology

Author

  • Janis Zuters
Abstract

Many extensions have been made to the classic multi-layer perceptron (MLP) model. A notable number of them have been designed to speed up the learning process without considering the quality of generalization. This paper proposes a new MLP extension based on exploiting the topology of the network's input layer. Experimental results show that the extended model improves generalization capability in certain cases. The new model requires additional computational resources compared to the classic model; nevertheless, the loss in efficiency is not regarded as significant.

Keywords—learning algorithm, multi-layer perceptron, topology.

I. INTRODUCTION

Generalization is one of the main capabilities of artificial neural networks. Generalization refers to the neural network producing reasonable outputs for inputs not encountered during training [1]. A network is said to generalize well when the input-output mapping computed by the network is correct (or nearly so) for test data never used in creating or training the network. Generalization is influenced by three factors: (i) the size of the training set, and how representative it is of the environment of interest, (ii) the architecture of the neural network, and (iii) the physical complexity of the problem at hand [1]. Let us focus on the second factor, starting with the following example (Fig. 1).

[Fig. 1. Four sample patterns, panels (a)-(d), to demonstrate the human ability to differentiate.]

Here four bitmaps with similar proportions of black and white pixels are depicted. The first two bitmaps are obviously easier for humans to differentiate than the last two, simply because similar pixels are concentrated together. In any case, the easier differentiation is attained by observing the topological character of the patterns. The goal of this paper is to propose an extension to the MLP model that observes the topology of input patterns in order to improve the generalization capability of a neural network.

Manuscript received July 15, 2005. J. Z. is with the Department of Computer Science, University of Latvia, Riga, Latvia (e-mail: [email protected]).

II. THE CLASSIC MLP WITH BACK-PROPAGATION

The MLP is regarded as a reference point for neural networks, and back-propagation is the basis for training supervised neural networks. The core MLP operating and training algorithm is briefly given by (1), (2), and (3) [1], [2]. Propagation function:
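The three referenced expressions were cut off in this extraction. A standard formulation following [1], [2] is sketched below; the exact notation used in the original paper is an assumption here.

    net_j = \sum_i w_{ji} \, y_i                          (1)

Activation function (logistic sigmoid applied to the net input):

    y_j = \varphi(net_j) = \frac{1}{1 + e^{-net_j}}       (2)

Weight update (back-propagation delta rule, with learning rate \eta and local gradient \delta_j):

    \Delta w_{ji} = \eta \, \delta_j \, y_i               (3)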
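For concreteness, a minimal runnable sketch of this classic model (not the paper's topology-based extension) is given below. The one-hidden-layer architecture, learning rate, random seed, and XOR task are all illustrative assumptions, not taken from the paper.

import numpy as np

# Minimal classic MLP with back-propagation, following (1)-(3).
# Network sizes, learning rate, and the XOR data are assumptions.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# 2 inputs -> 3 hidden units -> 1 output
W1 = rng.normal(0.0, 0.5, (3, 2))
b1 = np.zeros(3)
W2 = rng.normal(0.0, 0.5, (1, 3))
b2 = np.zeros(1)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

eta = 0.5  # learning rate
for epoch in range(10000):
    for x, t in zip(X, T):
        # (1) propagation and (2) activation, layer by layer
        h = sigmoid(W1 @ x + b1)
        y = sigmoid(W2 @ h + b2)
        # local gradients (deltas) for squared error with sigmoid units
        d_out = (y - t) * y * (1 - y)
        d_hid = (W2.T @ d_out) * h * (1 - h)
        # (3) delta-rule weight updates
        W2 -= eta * np.outer(d_out, h)
        b2 -= eta * d_out
        W1 -= eta * np.outer(d_hid, x)
        b1 -= eta * d_hid

out = sigmoid(W2 @ sigmoid(W1 @ X.T + b1[:, None]) + b2[:, None])
print(out.ravel())  # typically close to [0, 1, 1, 0] after training

The inner loop corresponds directly to (1)-(3): a weighted-sum propagation, a sigmoid activation, and a weight change proportional to the local gradient and the incoming activation.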


Similar resources


A TS Fuzzy Model Derived from a Typical Multi-Layer Perceptron

In this paper, we introduce a Takagi-Sugeno (TS) fuzzy model which is derived from a typical Multi-Layer Perceptron Neural Network (MLP NN). First, it is shown that the considered MLP NN can be interpreted as a variant of the TS fuzzy model. It is discussed that the Membership Function (MF) used in such a TS fuzzy model, despite its flexible structure, has some major restrictions. After modify...

Full text

An Automated MR Image Segmentation System Using Multi-layer Perceptron Neural Network

Background: Brain tissue segmentation for delineation of 3D anatomical structures from magnetic resonance (MR) images can be used to study neuro-degenerative disorders, characterizing morphological differences between subjects based on volumetric analysis of gray matter (GM), white matter (WM) and cerebrospinal fluid (CSF), but only if the obtained segmentation results are correct. Due to image arti...

Full text

New full adders using multi-layer perceptron network

How to reconfigure a logic gate for a variety of functions is an interesting topic. In this paper, a different method of designing logic gates is proposed. Initially, owing to the training ability of the multi-layer perceptron neural network, it was used to create new types of logic and full-adder gates. In this method, the perceptron network was trained and then tested. This network was 100% ac...

Full text

A New Hybrid model of Multi-layer Perceptron Artificial Neural Network and Genetic Algorithms in Web Design Management Based on CMS

The size and complexity of websites have grown significantly during recent years. In line with this growth, the need to maintain most of these resources has intensified. Content Management Systems (CMSs) are software introduced in response to the increased demands of users. With the advent of Content Management Systems, factors such as domains, predesigned module development, grap...

Full text


Journal title:

Volume   Issue

Pages  -

Publication date: 2005